New Info from Google and Yahoo! Tilts the Geo-Targeting Balance

At the SMX Sydney conference in Australia this past week, search engineers Priyank Garg & Greg Grothaus (of Yahoo! & Google, respectively) shared information about duplicate content filtering across domains of which I and many of the other speakers/attendees were previously unaware.

Priyank, when asked about best practices for “localizing” English-language content across domains, noted that Yahoo! does not filter duplicate content out of its results when the same works appear on multiple ccTLD domains. Greg confirmed that Google’s engine behaves the same way: with the exception of potentially spammy or manipulative sites, reproducing the same content on, for example, yoursite.com, yoursite.co.uk and yoursite.com.au is perfectly acceptable and shouldn’t trigger removal for duplicate content (assuming those sites properly target their individual geographic regions).
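To make that behavior concrete, here’s a toy Python sketch of a duplicate filter that drops identical content *within* a single ccTLD but deliberately lets the same content survive *across* different ccTLDs. This is purely illustrative of the policy as described – it is in no way how either engine is actually implemented, and every name here is invented for the example:

```python
import hashlib
from urllib.parse import urlsplit

def content_fingerprint(html: str) -> str:
    """Hash of the page body; identical pages collide on purpose."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def suffix_of(url: str) -> str:
    """Toy 'ccTLD' extraction: yoursite.co.uk -> co.uk, yoursite.com -> com.
    Real code would consult the Public Suffix List instead."""
    host = urlsplit(url).hostname or ""
    return host.split(".", 1)[1] if "." in host else host

def filter_duplicates(pages: dict[str, str]) -> set[str]:
    """Keep at most one URL per (fingerprint, ccTLD) pair.

    Duplicates within a ccTLD are dropped; the same content living on
    .com, .co.uk and .com.au all survives, mirroring the behavior the
    Yahoo!/Google engineers described.
    """
    seen: set[tuple[str, str]] = set()
    kept: set[str] = set()
    for url, html in pages.items():
        key = (content_fingerprint(html), suffix_of(url))
        if key not in seen:
            seen.add(key)
            kept.add(url)
    return kept

pages = {
    "http://yoursite.com/widgets":    "<html>widgets</html>",
    "http://yoursite.co.uk/widgets":  "<html>widgets</html>",   # survives: different ccTLD
    "http://yoursite.com.au/widgets": "<html>widgets</html>",   # survives: different ccTLD
    "http://yoursite.com/widgets-2":  "<html>widgets</html>",   # filtered: dupe on same ccTLD
}
print(sorted(filter_duplicates(pages)))
```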

This (seemingly) came as a surprise to the audience, including noted experts on the topic David Temple (Head of Search Marketing for Ogilvy in Singapore), Cindy Krum (Founder of Rank Mobile) and Rob Kerry (Head of Search for Ayima). According to Priyank, it shouldn’t have been a surprise, as he felt this best-practice recommendation had been communicated previously (so this may not be new information for everyone). Greg was less clear about whether this was “new” information from Google, but supported using content across shared-language domains in this fashion.

For those a bit confused, I’ve created this quick comic/illustration:

Comic Illustrating Geo-Targeting the ccTLD Search Engines Properly

In my opinion, this shifts the balance quite a bit in favor of creating separate, country-specific top-level domains when geo-targeting in each specific region is important to the brand/organization. However, I have to balance that against my personal experience with country-targeted domains for small and medium-sized businesses. The engines place an enormous amount of value on domain authority, and it’s been our experience that many smaller sites that launch overseas rank worse with their geo-targeted domain than they did with a region-agnostic .com or .org.

As an example, SEOmoz.org gets significant traffic from English-language searches outside the US, and our domain trust/authority/history/link metrics help to bolster that. If we were to launch seomoz.co.uk or seomoz.com.au, replicate our content on those domains, and then restrict geo-targeting (in Google’s Webmaster Tools and through IP-based redirection, for example), I wouldn’t be surprised to see the rankings outside the US (for the .org version) suffer dramatically, until and unless the UK/Australian domains gained similar levels of link popularity/importance. However, if we were a much larger brand – an Expedia, AT&T, Tesco, etc. – it might make much more sense to localize and build each domain individually to get the benefits of the engines’ local-preference algorithms (which favor geographic proximity).
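For the IP-based redirection piece, here’s a minimal sketch of what that might look like – a tiny Flask app that 302-redirects visitors to the ccTLD matching their country. The `country_for_ip` lookup is a stand-in (a real implementation would query a GeoIP database such as MaxMind’s), and the domain map and helper names are assumptions for the example, not a prescribed setup:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Assumed mapping of visitor country -> geo-targeted domain for this example.
DOMAIN_FOR_COUNTRY = {
    "GB": "yoursite.co.uk",
    "AU": "yoursite.com.au",
}
DEFAULT_DOMAIN = "yoursite.com"  # region-agnostic fallback

def country_for_ip(ip: str) -> str:
    """Stub: resolve an IP to an ISO country code.

    A real implementation would use a GeoIP database; hard-coded here
    so the sketch stays self-contained and runnable.
    """
    fake_db = {"203.0.113.7": "AU", "198.51.100.9": "GB"}
    return fake_db.get(ip, "US")

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def geo_redirect(path):
    country = country_for_ip(request.remote_addr or "")
    target = DOMAIN_FOR_COUNTRY.get(country, DEFAULT_DOMAIN)
    if request.host != target:
        # 302, not 301: a geo decision shouldn't be cached as permanent.
        return redirect(f"http://{target}/{path}", code=302)
    return f"Serving {request.host}/{path} locally"

if __name__ == "__main__":
    app.run()
```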

BTW – All the panelists noted that it would still be best practice from a usability and conversion standpoint to localize language usage and create a customized experience for each audience. It’s just that search engines won’t block those pages from inclusion in their indices simply for content overlap.
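As a trivial illustration of that kind of localization (regional spelling, currency and copy tweaks on otherwise-shared content), here’s a short sketch – all of the domains, strings and field names are invented for the example:

```python
# Hypothetical per-domain copy: same product page, locally adapted wording.
LOCALIZED_COPY = {
    "yoursite.com":    {"spelling": "color",  "currency": "USD", "cta": "Check out"},
    "yoursite.co.uk":  {"spelling": "colour", "currency": "GBP", "cta": "Checkout"},
    "yoursite.com.au": {"spelling": "colour", "currency": "AUD", "cta": "Checkout"},
}

def render_tagline(domain: str) -> str:
    # Fall back to the region-agnostic .com copy for unknown domains.
    copy = LOCALIZED_COPY.get(domain, LOCALIZED_COPY["yoursite.com"])
    return f"Pick a {copy['spelling']} you love – prices in {copy['currency']}"

print(render_tagline("yoursite.co.uk"))   # Pick a colour you love – prices in GBP
print(render_tagline("yoursite.com"))     # Pick a color you love – prices in USD
```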

As always, looking forward to your thoughts on the issue (and apologies for any errors in the writing – I’ve just landed in Los Angeles after a 13-hour flight from Sydney and had to grab a hotel, as I foolishly booked my flight to Seattle for tomorrow morning – doh!).

p.s. Folks in the comments have pointed out that Vanessa Fox has previously discussed this issue in great detail, and that Neerav from Bruce Clay Australia has some good coverage of this particular session. I’m also a fan of Kalena Jordan’s coverage of SMX, though it didn’t hit this particular topic.
